Stagewise Weak Gradient Pursuits Part II: Theoretical Properties

Authors

  • Thomas Blumensath
  • Mike E. Davies
Abstract

In a recent paper [2] we introduced the greedy Gradient Pursuit framework, a family of algorithms designed to find sparse solutions to underdetermined inverse problems. One particularly powerful member of this family is the (approximate) conjugate gradient pursuit algorithm, which was shown to be applicable to very large data sets and to perform nearly as well as the traditional Orthogonal Matching Pursuit algorithm. In Part I of this paper [3], we further extended the Gradient Pursuit framework and introduced a greedier stagewise weak selection strategy that selects several elements per iteration. Combining the conjugate gradient update of [2] with this selection strategy led to a very fast algorithm, applicable to large-scale problems, which was shown in [3] to perform well in a wide range of applications. In this paper we study the theoretical properties of the approach. In particular, we propose a novel fast recursion to calculate the conjugate gradient update direction and prove that this update is guaranteed to be better than a simple gradient update. The other contribution of this paper is a set of theoretical guarantees giving conditions under which the stagewise weak selection strategy exactly recovers sparse vectors from few measurements, that is, conditions under which the algorithm is guaranteed to find the sparsest solution.
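To make the ingredients named in the abstract concrete, here is a minimal Python sketch of one stagewise weak pursuit iteration. It assumes the weak selection rule picks every column whose correlation with the current residual is within a factor alpha of the largest correlation, and it uses a plain gradient step with exact line search in place of the (approximate) conjugate gradient update analysed in the paper; all names and parameter values are illustrative, not the authors' implementation.

    import numpy as np

    def stagewise_weak_pursuit(A, y, alpha=0.7, n_iter=20, tol=1e-6):
        # A: (m, n) measurement matrix with m < n; y: (m,) observations.
        # alpha in (0, 1] is the weakness parameter; alpha = 1 reduces the
        # selection step to picking only the maximally correlated column(s).
        n = A.shape[1]
        x = np.zeros(n)
        support = np.zeros(n, dtype=bool)
        r = y.copy()
        for _ in range(n_iter):
            g = A.T @ r                           # correlations with the residual
            # stagewise weak selection: all columns within a factor alpha
            # of the best correlation join the support at once
            support |= np.abs(g) >= alpha * np.abs(g).max()
            d = np.zeros(n)
            d[support] = g[support]               # update direction on the support
            Ad = A @ d
            if Ad @ Ad == 0:                      # nothing left to update
                break
            x += ((r @ Ad) / (Ad @ Ad)) * d       # exact line search along d
            r = y - A @ x
            if np.linalg.norm(r) < tol:
                break
        return x, support

In the conjugate gradient variant studied in the paper, the direction d would instead be made conjugate, with respect to the Gram matrix of the selected columns, to previous update directions; the fast recursion proposed here concerns computing such directions cheaply.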

Similar Resources

Stagewise Weak Gradient Pursuits Part I: Fundamentals and Numerical Studies

Finding sparse solutions to underdetermined inverse problems is a fundamental challenge encountered in a wide range of signal processing applications, from signal acquisition to source separation. Recent theoretical advances in our understanding of this problem have further increased interest in applying these methods in various domains. In many areas, such as medical imaging or geophysi...

AdaBoost and Forward Stagewise Regression are First-Order Convex Optimization Methods

Boosting methods are highly popular and effective supervised learning methods which combine weak learners into a single accurate model with good statistical performance. In this paper, we analyze two well-known boosting methods, AdaBoost and Incremental Forward Stagewise Regression (FSε), by establishing their precise connections to the Mirror Descent algorithm, which is a first-order method in...

A general framework for fast stagewise algorithms

Forward stagewise regression follows a very simple strategy for constructing a sequence of sparse regression estimates: it starts with all coefficients equal to zero, and iteratively updates the coefficient (by a small amount ε) of the variable that achieves the maximal absolute inner product with the current residual. This procedure has an interesting connection to the lasso: under some conditi...
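The quoted update rule is simple enough to state directly; the following minimal Python sketch follows that description, with the step size eps and the number of steps chosen purely for illustration.

    import numpy as np

    def forward_stagewise(X, y, eps=0.01, n_steps=5000):
        # Start with all coefficients at zero and repeatedly nudge, by a
        # small amount eps, the coefficient of the predictor with maximal
        # absolute inner product with the current residual.
        p = X.shape[1]
        beta = np.zeros(p)
        r = y.copy()
        for _ in range(n_steps):
            corr = X.T @ r
            j = np.argmax(np.abs(corr))
            delta = eps * np.sign(corr[j])
            beta[j] += delta
            r -= delta * X[:, j]          # keep the residual consistent
        return beta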

Commentary on: "Piaget's stages: the unfinished symphony of cognitive development" by D. H. Feldman

Dr. Feldman’s paper constitutes a worthy continuation of the select pedigree of thorough neo-Piagetian theory constructions. It presents a general theory of stagewise cognitive development, in the spirit of Piaget’s naturalized epistemology. Theory construction at the general level at which Piaget operated has, for good reasons, an essential place in cognitive developmental psychology. In parti...

A Robust Boosting Method for Mislabeled Data

We propose a new, robust boosting method that uses a sigmoidal function as the loss function. In deriving the method, the stagewise additive modelling methodology is blended with gradient descent algorithms. Based on intensive numerical experiments, we show that the proposed method outperforms AdaBoost and other regularized methods in test error rate in the case of noisy, ...
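The robustness claim rests on the loss being bounded: the small Python comparison below shows how a sigmoidal loss caps the penalty, and hence the influence, of a badly mislabeled point, whereas AdaBoost's exponential loss grows without bound. The particular sigmoidal form is an illustrative assumption, not necessarily the one used in that paper.

    import numpy as np

    def exp_loss(margin):
        # AdaBoost's exponential loss: unbounded as the margin becomes
        # negative, so one mislabeled point can dominate the fit.
        return np.exp(-margin)

    def sigmoid_loss(margin):
        # A bounded sigmoidal loss: saturates near 1 for large negative
        # margins, capping the influence of any single example.
        return 1.0 / (1.0 + np.exp(margin))

    margins = np.array([-5.0, -1.0, 0.0, 1.0, 5.0])
    print(exp_loss(margins))      # approx [148.4, 2.72, 1.0, 0.37, 0.0067]
    print(sigmoid_loss(margins))  # approx [0.99, 0.73, 0.5, 0.27, 0.0067]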

Journal:

Volume:   Issue:

Pages:   -

Publication year: 2008